Path: blob/master/Part 3 - Classification/Support Vector Machine/[Python] Support Vector Machine.ipynb
Kernel: Python 3
Support Vector Machine
Data preprocessing
In [26]:
In [27]:
In [16]:
Out[16]:
In [28]:
In [19]:
Out[19]:
array([[ 27, 57000],
[ 46, 28000],
[ 39, 134000],
[ 44, 39000],
[ 57, 26000],
[ 32, 120000],
[ 41, 52000],
[ 48, 74000],
[ 26, 86000],
[ 22, 81000]])
In [20]:
Out[20]:
array([[ 46, 22000],
[ 59, 88000],
[ 28, 44000],
[ 48, 96000],
[ 29, 28000],
[ 30, 62000],
[ 47, 107000],
[ 29, 83000],
[ 40, 75000],
[ 42, 65000]])
In [22]:
Out[22]:
array([0, 1, 1, 0, 1, 1, 0, 1, 0, 0])
In [23]:
Out[23]:
array([0, 1, 0, 1, 0, 0, 1, 0, 0, 0])
In [30]:
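The preprocessing code cells were not preserved in this export. Judging from the arrays above (an age-like column and a salary-like column, with an 80-row test set implied by the confusion matrix later on), the cells presumably load the data, split it into training and test sets, and apply feature scaling. A minimal sketch on synthetic stand-in data (the column names, sizes, and random seeds here are assumptions, not the notebook's original code):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for the real dataset: an age-like feature and a
# salary-like feature, matching the shape of the arrays shown above.
rng = np.random.default_rng(42)
ages = rng.integers(18, 60, size=400)
salaries = rng.integers(15000, 150000, size=400)
X = np.column_stack([ages, salaries])
y = (salaries > 80000).astype(int)  # toy labels for illustration only

# 80/20 split -> 320 training and 80 test rows, consistent with the
# 80 predictions summed up by the confusion matrix below.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

# Feature scaling matters here: the two features differ by orders of
# magnitude, and SVMs are sensitive to feature scale.
sc = StandardScaler()
X_train_scaled = sc.fit_transform(X_train)
X_test_scaled = sc.transform(X_test)  # training-set statistics only
```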
Fitting SVM classifier to the Training set
In [31]:
Out[31]:
SVC(C=1.0, cache_size=200, class_weight=None, coef0=0.0,
decision_function_shape='ovr', degree=3, gamma='auto', kernel='linear',
max_iter=-1, probability=False, random_state=42, shrinking=True,
tol=0.001, verbose=False)
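The fitted-model repr above pins down the cell that produced it: an `SVC` with `kernel='linear'`, `random_state=42`, and otherwise default parameters (the `gamma='auto'` default in the repr points to an older scikit-learn version; it is irrelevant for a linear kernel). A self-contained sketch, with synthetic data standing in for the scaled training set:

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic, linearly separable stand-in for the scaled training features
rng = np.random.default_rng(0)
X_train = rng.normal(size=(320, 2))
y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)

# Matches the repr in Out[31]: linear kernel, fixed random state,
# everything else left at its default.
classifier = SVC(kernel='linear', random_state=42)
classifier.fit(X_train, y_train)
```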
Predicting the Test set results
In [32]:
In [13]:
Out[13]:
array([0, 1, 0, 1, 0, 0, 1, 0, 0, 0])
In [14]:
Out[14]:
array([0, 1, 0, 1, 0, 0, 1, 0, 0, 0])
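The two identical arrays above are presumably `y_pred` and the matching slice of `y_test` being compared side by side. The prediction step itself is one call; a runnable sketch on synthetic stand-in data (the variable names follow the usual convention for this notebook series, which is an assumption):

```python
import numpy as np
from sklearn.svm import SVC

# Synthetic stand-in for the scaled train/test data
rng = np.random.default_rng(0)
X_train = rng.normal(size=(320, 2))
y_train = (X_train[:, 0] > 0).astype(int)
X_test = rng.normal(size=(80, 2))

classifier = SVC(kernel='linear', random_state=42).fit(X_train, y_train)

# One label (0 or 1) per test row, same length as y_test
y_pred = classifier.predict(X_test)
```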
Making the Confusion Matrix
In [33]:
Out[33]:
array([[50, 2],
[ 9, 19]])
The classifier made 50 + 19 = 69 correct predictions and 2 + 9 = 11 incorrect predictions.
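In scikit-learn the matrix above would come from `sklearn.metrics.confusion_matrix(y_test, y_pred)`, where rows are true labels and columns are predicted labels. The arithmetic can be checked directly on the matrix from Out[33]:

```python
import numpy as np

# Confusion matrix from Out[33]: rows = true class, columns = predicted class
cm = np.array([[50, 2],    # 50 true negatives,  2 false positives
               [9, 19]])   #  9 false negatives, 19 true positives

correct = np.trace(cm)           # diagonal: 50 + 19 = 69
incorrect = cm.sum() - correct   # off-diagonal: 2 + 9 = 11
accuracy = correct / cm.sum()    # 69 / 80 = 0.8625
```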
Visualising the Training set results
In [35]:
Out[35]:
Visualising the Test set results
In [37]:
Out[37]:
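The rendered plots are not preserved in this export. A common way to visualise the decision regions of a 2-D classifier like this one is to evaluate it over a dense grid and colour each region by predicted class; with a linear kernel the boundary comes out straight. A sketch on synthetic stand-in data (not the notebook's original plotting code):

```python
import numpy as np
import matplotlib
matplotlib.use('Agg')  # headless backend so this runs without a display
import matplotlib.pyplot as plt
from sklearn.svm import SVC

# Synthetic stand-in for the scaled features
rng = np.random.default_rng(42)
X = rng.normal(size=(320, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(int)
clf = SVC(kernel='linear', random_state=42).fit(X, y)

# Predict over a dense grid covering the feature space
x1, x2 = np.meshgrid(
    np.arange(X[:, 0].min() - 1, X[:, 0].max() + 1, 0.02),
    np.arange(X[:, 1].min() - 1, X[:, 1].max() + 1, 0.02))
Z = clf.predict(np.c_[x1.ravel(), x2.ravel()]).reshape(x1.shape)

# Shade the two decision regions and overlay the data points
plt.contourf(x1, x2, Z, alpha=0.3)
plt.scatter(X[:, 0], X[:, 1], c=y, edgecolors='k', s=15)
plt.xlabel('Age (scaled)')
plt.ylabel('Estimated Salary (scaled)')
plt.title('SVM (linear kernel) decision regions')
plt.savefig('svm_regions.png')
```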
So there is not much change compared to Logistic Regression: the linear-kernel SVM also produces a straight decision boundary.